Thirty Days of Metal — Day 17: Assets
This series of posts is my attempt to present the Metal graphics programming framework in small, bite-sized chunks for Swift app developers who haven’t done GPU programming before.
If you want to work through this series in order, start here. To download the sample code for this article, go here.
So far, when we have needed meshes to draw, we have asked Model I/O to generate basic 3D shapes. In practice, we often want to draw more sophisticated and detailed 3D models produced by an artist. In this article, we will look at how to start loading geometry and texture data from 3D model files.
An asset is any file that contains data authored separately from the application source code. This broad definition includes everything from images to sound clips to videos. A 3D model is a type of asset that contains meshes, materials, animations, and other data that can be rendered in 3D. For the purposes of this article, we will use “3D model” and “asset” interchangeably.
There are many model file formats in current use: Wavefront OBJ, Khronos Group’s glTF, Pixar’s USD and Apple’s USDZ, and Autodesk’s FBX, to name a few. Each has different features and stores data differently, but all are capable of storing node hierarchies, meshes, and materials. We will use OBJ in this article because of its simplicity. Be aware that in recent years, it has been superseded by more sophisticated formats like USD and glTF.
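As an aside, you can ask Model I/O at runtime whether it can import a given format. Here is a minimal sketch using the MDLAsset.canImportFileExtension(_:) class method; the exact set of importable extensions depends on your OS version, so the printed results may vary:

```swift
import ModelIO

// Ask Model I/O whether it can import each file extension.
// The supported set varies by OS version, so results may differ on your machine.
for ext in ["obj", "usd", "usdz", "abc", "ply", "stl"] {
    let supported = MDLAsset.canImportFileExtension(ext)
    print("\(ext): \(supported ? "importable" : "not importable")")
}
```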
We will continue to use the MDLMesh and MTKMesh classes from Model I/O and MetalKit to store mesh data, but we will get our meshes in a different way this time. We will use the MDLAsset class to load a 3D mesh and its associated texture, then convert them into forms that are suitable for rendering with Metal.
Getting 3D Models
Even if you can’t afford an artist to build custom 3D models for you, there are many sources of free and affordable models online. The largest of these is Sketchfab. Another, newer entrant is Poly Haven. Consider browsing these sites for a model you like and using it to customize the sample app.
I have selected a free model named “Spot,” created by Keenan Crane of Carnegie Mellon University.
Spot the cow is a simple model consisting of a single mesh and a single texture, which makes her a perfect candidate for our initial exploration of asset loading with Model I/O.
Introducing MDLAsset
MDLAsset is a versatile class that can hold many different types of objects: lights, cameras, meshes, materials, and more.
To create an asset, we need three things: a file URL that points to the model on disk, a vertex descriptor, and a buffer allocator.
As the mesh creation code is now somewhat lengthy, I extracted it into a method named loadAsset().
We have used vertex descriptors and buffer allocators extensively, but I’ll repeat the variable declarations here for completeness:
let allocator = MTKMeshBufferAllocator(device: device)
let mdlVertexDescriptor = MDLVertexDescriptor()
mdlVertexDescriptor.vertexAttributes[0].name = MDLVertexAttributePosition
mdlVertexDescriptor.vertexAttributes[0].format = .float3
mdlVertexDescriptor.vertexAttributes[0].offset = 0
mdlVertexDescriptor.vertexAttributes[0].bufferIndex = 0
mdlVertexDescriptor.vertexAttributes[1].name = MDLVertexAttributeNormal
mdlVertexDescriptor.vertexAttributes[1].format = .float3
mdlVertexDescriptor.vertexAttributes[1].offset = 12
mdlVertexDescriptor.vertexAttributes[1].bufferIndex = 0
mdlVertexDescriptor.vertexAttributes[2].name = MDLVertexAttributeTextureCoordinate
mdlVertexDescriptor.vertexAttributes[2].format = .float2
mdlVertexDescriptor.vertexAttributes[2].offset = 24
mdlVertexDescriptor.vertexAttributes[2].bufferIndex = 0
mdlVertexDescriptor.bufferLayouts[0].stride = 32
vertexDescriptor = MTKMetalVertexDescriptorFromModelIO(mdlVertexDescriptor)!
We create an MTKMeshBufferAllocator so Model I/O can write model data directly into MTLBuffers. Then we define a vertex descriptor that has positions, normals, and texture coordinates, just as we did when generating our sphere and box meshes. Finally, we convert the Model I/O vertex descriptor to an MTLVertexDescriptor and store it to use when we create our render pipeline state.
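For reference, the descriptor produced by MTKMetalVertexDescriptorFromModelIO is equivalent to one built by hand on the Metal side. A sketch of the converted result for our layout, with the same attribute indices, offsets, and stride as above (the variable name is mine, not from the sample project):

```swift
import Metal

// The Metal-side equivalent of the Model I/O descriptor above:
// interleaved position, normal, and texture coordinates in buffer 0.
let metalVertexDescriptor = MTLVertexDescriptor()
metalVertexDescriptor.attributes[0].format = .float3   // position
metalVertexDescriptor.attributes[0].offset = 0
metalVertexDescriptor.attributes[0].bufferIndex = 0
metalVertexDescriptor.attributes[1].format = .float3   // normal
metalVertexDescriptor.attributes[1].offset = 12
metalVertexDescriptor.attributes[1].bufferIndex = 0
metalVertexDescriptor.attributes[2].format = .float2   // texture coordinate
metalVertexDescriptor.attributes[2].offset = 24
metalVertexDescriptor.attributes[2].bufferIndex = 0
metalVertexDescriptor.layouts[0].stride = 32           // 12 + 12 + 8 bytes
```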
Now we just need an asset URL. To make a model available to your app, add it to your project and make sure it is included in the app target’s Copy Bundle Resources build phase. In the case of the Spot model, there are three files: an OBJ file containing the mesh, a PNG file containing the texture, and an MTL file containing material definitions.
Model I/O will automatically find other files referenced by a model, so we only need to create a URL pointing to the main OBJ file:
let assetURL = Bundle.main.url(
    forResource: "spot_triangulated",
    withExtension: "obj")!
Now we can instantiate an MDLAsset with these parameters:
let mdlAsset = MDLAsset(url: assetURL,
                        vertexDescriptor: mdlVertexDescriptor,
                        bufferAllocator: allocator)
MDLAsset does not automatically “resolve” texture file references, so we call loadTextures() to ask it to search through the materials in the file and locate any image files referenced by them.
mdlAsset.loadTextures()
Since a Model I/O asset can contain many different types of objects, we need to ask explicitly for objects by their type. We do this by calling the childObjects(of:) method:
let meshes = mdlAsset.childObjects(of: MDLMesh.self) as? [MDLMesh]
guard let mdlMesh = meshes?.first else {
fatalError("Did not find any meshes in the Model I/O asset")
}
To load the textures referenced by the model file, we will again use MTKTextureLoader.
let textureLoader = MTKTextureLoader(device: device)
let options: [MTKTextureLoader.Option : Any] = [
.textureUsage : MTLTextureUsage.shaderRead.rawValue,
.textureStorageMode : MTLStorageMode.private.rawValue,
.origin : MTKTextureLoader.Origin.bottomLeft.rawValue
]
Materials in Model I/O are represented by the MDLMaterial class. Each material holds numerous instances of MDLMaterialProperty, each one representing a material property such as base color, roughness, or metalness.
For the moment, we will assume that each mesh has a single material, held by its first submesh.
let firstSubmesh = mdlMesh.submeshes?.firstObject as? MDLSubmesh
let material = firstSubmesh?.material
Specifically, we care about the base color property. If it contains a texture, we get a file URL pointing to it and use our texture loader to turn it into an MTLTexture:
var texture: MTLTexture?
if let baseColorProperty = material?.property(
with: MDLMaterialSemantic.baseColor)
{
if baseColorProperty.type == .texture,
let textureURL = baseColorProperty.urlValue
{
texture = try? textureLoader.newTexture(
URL: textureURL,
options: options)
}
}
Assuming our mesh and texture were loaded successfully, we can convert the mesh to an MTKMesh, which we already know how to draw.
let mesh = try! MTKMesh(mesh: mdlMesh, device: device)
We create a node to hold the mesh and assign the base color texture to it. Then we hold references to them so we can update and render the node later in our draw method.
cowNode = Node(mesh: mesh)
cowNode.texture = texture
nodes = [cowNode]
This concludes our loadAsset() method. The rest of the code is largely unchanged.
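One final aside: loadAsset() uses force-unwraps, try!, and fatalError for brevity. In a shipping app you might prefer a throwing variant so callers can recover gracefully. Here is a rough sketch under that assumption; the AssetError type and loadMesh(named:device:vertexDescriptor:) function are hypothetical, not part of the sample project:

```swift
import Foundation
import MetalKit
import ModelIO

// A hypothetical error type for asset-loading failures.
enum AssetError: Error {
    case fileNotFound(String)
    case noMeshes
}

// A throwing variant of the loading logic: callers can catch and
// recover instead of crashing when an asset is missing or malformed.
func loadMesh(named name: String,
              device: MTLDevice,
              vertexDescriptor: MDLVertexDescriptor) throws -> MTKMesh
{
    guard let url = Bundle.main.url(forResource: name, withExtension: "obj") else {
        throw AssetError.fileNotFound(name)
    }
    let allocator = MTKMeshBufferAllocator(device: device)
    let asset = MDLAsset(url: url,
                         vertexDescriptor: vertexDescriptor,
                         bufferAllocator: allocator)
    asset.loadTextures()
    guard let mdlMesh = asset.childObjects(of: MDLMesh.self).first as? MDLMesh else {
        throw AssetError.noMeshes
    }
    return try MTKMesh(mesh: mdlMesh, device: device)
}
```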
Running the sample app, we can see Spot the cow spinning in all her textured glory.
We have now used textures to add a bit more surface detail and realism to our scenes, but we are still far from photorealism. In the next article, we will consider how to add lights to our scenes.